Convergence properties of gradient descent noise reduction
Authors
Abstract
Gradient descent noise reduction is a technique that attempts to recover the true signal, or trajectory, from noisy observations of a non-linear dynamical system for which the dynamics are known. This paper provides the first rigorous proof that the algorithm will recover the original trajectory for a broad class of dynamical systems under certain conditions. The proof is obtained using ideas from linearisation theory. Since the first introduction of the algorithm it has been recognised that the algorithm can fail to recover the true trajectory, and it has been suggested that this is a practical or numerical limitation that is a consequence of near tangencies between stable and unstable manifolds. This paper demonstrates through numerical experiments and details of the proof that the situation is worse than expected in that near tangencies impose essential limitations on noise reduction, not just practical or numerical limitations. That is, gradient descent noise reduction will sometimes fail to recover the true trajectory, even with unlimited, perfect computation. On the other hand, the numerical experiments suggest that the gradient descent noise-reduction algorithm will always recover a trajectory that is entirely consistent with the evidence provided by the observations, that is, it attains the best that can be achieved given the observations. It is argued that near tangencies will therefore impose the same limitations on any noise-reduction algorithm. © 2002 Elsevier Science B.V. All rights reserved.
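As a concrete illustration of the technique described in the abstract, the following is a minimal sketch of gradient descent noise reduction for a known one-dimensional map. The choice of the logistic map, the noise level, and the step size are illustrative assumptions, not the paper's actual setup: the trajectory is initialised at the noisy observations and descended along the gradient of the dynamical-mismatch cost.

```python
import numpy as np

def f(x):            # known dynamics: the logistic map (illustrative choice)
    return 4.0 * x * (1.0 - x)

def fprime(x):       # its derivative, needed for the analytic gradient
    return 4.0 - 8.0 * x

def dynamical_cost(x):
    r = x[1:] - f(x[:-1])          # dynamical mismatch along the trajectory
    return np.sum(r ** 2)

def gdnr(s, steps=2000, eta=1e-3):
    """Gradient descent noise reduction: start from the noisy
    observations s and descend the mismatch cost."""
    x = s.copy()
    for _ in range(steps):
        r = x[1:] - f(x[:-1])
        g = np.zeros_like(x)
        g[1:] += 2.0 * r                     # d/dx_{i+1} of r_i^2
        g[:-1] -= 2.0 * r * fprime(x[:-1])   # d/dx_i of r_i^2
        x -= eta * g
    return x

# build a true trajectory, corrupt it with noise, then denoise
rng = np.random.default_rng(0)
x_true = np.empty(50)
x_true[0] = 0.3
for i in range(49):
    x_true[i + 1] = f(x_true[i])
s = x_true + 1e-3 * rng.standard_normal(50)
x_hat = gdnr(s)
print(dynamical_cost(s), dynamical_cost(x_hat))  # the cost should drop
```

Note that the cost measures only consistency with the known dynamics; as the abstract argues, driving it to zero need not recover the true trajectory when stable and unstable manifolds are nearly tangent.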
Similar resources
How much gradient noise does a gradient-based linesearch method tolerate?
Among numerical methods for smooth unconstrained optimization, gradient-based linesearch methods, like quasi-Newton methods, may work quite well even in the presence of relatively high-amplitude noise in the gradient of the objective function. We present some properties of the amplitude of this noise which ensure a descent direction for such a method. Exploiting this bound, we also discuss cond...
Efficient algorithm for Speech Enhancement using Adaptive filter
The present speech-enhancement system is developed using an adaptive filtering approach with digital filters. The adaptive filter uses the least mean square (LMS) algorithm for noise removal, but in practical applications of the LMS algorithm, a key parameter is the step size. As is known, if the step size is large, the convergence rate of the LMS algorithm will be rapid, but the steady-state mean square e...
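The step-size trade-off described in this abstract can be sketched with a toy noise-cancellation setup. The signal, noise path, filter length, and step sizes below are all illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 5000
clean = np.sin(0.05 * np.arange(N))              # slowly varying "speech" signal
noise = rng.standard_normal(N)
h = np.array([0.6, 0.3, 0.1])                    # unknown noise path (assumed)
primary = clean + np.convolve(noise, h)[:N]      # microphone: signal + filtered noise
reference = noise                                # correlated noise-only reference

def lms(primary, reference, mu, taps=4):
    """Standard LMS adaptive noise canceller."""
    w = np.zeros(taps)
    out = np.zeros_like(primary)
    for n in range(taps, len(primary)):
        x = reference[n - taps + 1:n + 1][::-1]  # newest sample first
        y = w @ x                                # noise estimate
        e = primary[n] - y                       # enhanced (error) signal
        w += 2.0 * mu * e * x                    # LMS weight update
        out[n] = e
    return out

# a larger step size converges faster but raises the steady-state error
for mu in (0.001, 0.01):
    e = lms(primary, reference, mu)
    print(mu, np.mean((e[-1000:] - clean[-1000:]) ** 2))
```

With either step size the canceller should beat the raw primary signal; the residual error in the final samples reflects the steady-state behaviour the abstract refers to.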
Contraction analysis of nonlinear random dynamical systems
In order to bring contraction analysis into the very fruitful and topical fields of stochastic and Bayesian systems, we extend the theory described in [Lohmiller and Slotine, 1998] to random differential equations. We propose new definitions of contraction (almost sure contraction and contraction in mean square) which make it possible to control the evolution of a stochastic system in two ways. The...
Shadowing Pseudo-Orbits and Gradient Descent Noise Reduction
Shadowing trajectories are one of the most powerful ideas of modern dynamical systems theory, providing a tool for proving some central theorems and a means to assess the relevance of models and numerically computed trajectories of chaotic systems. Shadowing has also been seen to have a role in state estimation and forecasting of nonlinear systems. Shadowing trajectories are guaranteed to exist...
Variance Reduced Stochastic Gradient Descent with Sufficient Decrease
In this paper, we propose a novel sufficient decrease technique for variance reduced stochastic gradient descent methods such as SAG, SVRG and SAGA. In order to make sufficient decrease for stochastic optimization, we design a new sufficient decrease criterion, which yields sufficient decrease versions of variance reduction algorithms such as SVRG-SD and SAGA-SD as a byproduct. We introduce a c...
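The variance-reduction mechanism underlying the methods this abstract builds on can be sketched with plain SVRG on a least-squares problem; the problem data, step size, and epoch lengths below are illustrative, and the paper's sufficient-decrease criterion is not implemented here:

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 200, 5
A = rng.standard_normal((n, d))
b = A @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.standard_normal(n)

def loss(w):
    return 0.5 * np.mean((A @ w - b) ** 2)

def full_grad(w):
    return A.T @ (A @ w - b) / n

def svrg(w0, eta=0.01, epochs=20, m=200):
    """Plain SVRG: correct each stochastic gradient with the
    same sample's gradient at a periodically refreshed snapshot."""
    w = w0.copy()
    for _ in range(epochs):
        w_snap = w.copy()
        g_snap = full_grad(w_snap)                # full gradient at snapshot
        for _ in range(m):
            i = rng.integers(n)
            gi = A[i] * (A[i] @ w - b[i])         # sample gradient at w
            gi_s = A[i] * (A[i] @ w_snap - b[i])  # same sample at snapshot
            w -= eta * (gi - gi_s + g_snap)       # variance-reduced step
    return w

w = svrg(np.zeros(d))
print(loss(np.zeros(d)), loss(w))   # the loss should drop sharply
```

The correction term `gi - gi_s + g_snap` is an unbiased gradient estimate whose variance vanishes as the iterate approaches the snapshot, which is what allows a constant step size.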